📚 node [[weight]]
Welcome! Nobody has contributed anything to 'weight' yet. You can:
  • Write something in the document below!
    • There is at least one public document in every node in the Agora. Whatever you write in it will be integrated and made available for the next visitor to read and edit.
  • Write to the Agora from social media.
    • If you follow Agora bot on a supported platform and include the wikilink [[weight]] in a post, the Agora will link it here and optionally integrate your writing.
  • Sign up as a full Agora user.
    • As a full user you will be able to contribute your personal notes and resources directly to this knowledge commons. Some setup required :)
⥅ related node [[wikilinks amount to a lightweight universal standard for knowledge interoperability]]
⥅ related node [[upweighting]]
⥅ related node [[weight]]
⥅ related node [[weighted_alternating_least_squares_(wals)]]
⥅ node [[weight]] pulled by Agora

weight

Go back to the [[AI Glossary]]

A coefficient for a feature in a linear model, or an edge in a deep network. The goal of training a linear model is to determine the ideal weight for each feature. If a weight is 0, then its corresponding feature does not contribute to the model.
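The definition above can be made concrete with a small sketch (names and data are illustrative, not from the glossary): training a linear model on a target that depends on only one of two features recovers a near-zero weight for the irrelevant feature, which therefore contributes nothing to the model's predictions.

```python
import numpy as np

# Toy data: y depends only on the first feature (true weights: 3.0 and 0.0).
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 2))
y = 3.0 * X[:, 0]  # the second feature is irrelevant

# Training a linear model amounts to finding the ideal weight per feature.
weights, *_ = np.linalg.lstsq(X, y, rcond=None)
print(weights)  # ≈ [3.0, 0.0]; the zero weight means feature 2 does not contribute
```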

⥅ node [[weighted_alternating_least_squares_(wals)]] pulled by Agora

Weighted Alternating Least Squares (WALS)

Go back to the [[AI Glossary]]

#recsystems

An algorithm for minimizing the objective function during matrix factorization in recommendation systems, which allows downweighting of the missing examples. WALS minimizes the weighted squared error between the original matrix and its reconstruction by alternating between fixing the row factorization and fixing the column factorization. Each of these optimizations can be solved by least-squares convex optimization. For details, see the Recommendation Systems course.
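The alternating scheme described above can be sketched as follows. This is a minimal illustration, not the course's reference implementation: observed entries get weight 1, missing entries get a small weight `w0` (the downweighting), and each alternating step is a weighted ridge regression solved in closed form. All names and defaults here are assumptions for the sketch.

```python
import numpy as np

def wals(A, observed, k=2, w0=0.1, n_iters=20, reg=0.01, seed=0):
    """Weighted Alternating Least Squares (sketch).

    A:        (m, n) ratings matrix; unobserved cells are treated as 0.
    observed: (m, n) boolean mask; unobserved cells get weight w0 < 1.
    Returns row factors U (m, k) and column factors V (n, k).
    """
    rng = np.random.default_rng(seed)
    m, n = A.shape
    A = np.where(observed, A, 0.0)           # zero out missing entries
    W = np.where(observed, 1.0, w0)          # per-entry weights (downweight missing)
    U = rng.normal(scale=0.1, size=(m, k))
    V = rng.normal(scale=0.1, size=(n, k))
    for _ in range(n_iters):
        # Fix V; solve a weighted least-squares problem for each row factor u_i.
        for i in range(m):
            Wi = np.diag(W[i])
            U[i] = np.linalg.solve(V.T @ Wi @ V + reg * np.eye(k),
                                   V.T @ Wi @ A[i])
        # Fix U; solve symmetrically for each column factor v_j.
        for j in range(n):
            Wj = np.diag(W[:, j])
            V[j] = np.linalg.solve(U.T @ Wj @ U + reg * np.eye(k),
                                   U.T @ Wj @ A[:, j])
    return U, V
```

Each inner solve is a small k-by-k linear system, which is why alternating between rows and columns keeps every step convex and cheap even when the overall factorization objective is non-convex.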
